Computing Divergences between Discrete Decomposable Models

Authors

Abstract

There are many applications, including machine learning, that benefit from computing the exact divergence between two discrete probability measures. Unfortunately, in the absence of any assumptions about the structure or independencies within these distributions, computing the divergence between them is an intractable problem in high dimensions. We show that we are able to compute a wide family of functionals and divergences, such as the alpha-beta divergence, between two decomposable models, i.e., chordal Markov networks, in time exponential in the treewidth of these models. The alpha-beta divergence family includes popular divergences such as the Kullback-Leibler divergence, the Hellinger distance, and the chi-squared divergence. Thus, we can accurately compute the exact values of any of this broad class of divergences, to the extent to which we can accurately model the two distributions using decomposable models.
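To make the intractability concrete, the following is a minimal NumPy sketch (not the paper's algorithm) of the naive way to compute a Kullback-Leibler divergence: enumerating the full joint probability table, which grows exponentially in the number of variables. The paper's contribution is avoiding this blow-up for decomposable models, paying only time exponential in treewidth.

```python
import numpy as np

def kl_divergence(p, q):
    """Brute-force KL(p || q) over two full joint probability tables.

    Both inputs enumerate every joint state explicitly, so this costs
    O(k^n) space and time for n variables with k states each -- exactly
    the exponential cost that structure-exploiting methods avoid.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy example: two distributions over 3 binary variables (8 joint states).
p = np.array([0.10, 0.15, 0.05, 0.20, 0.10, 0.10, 0.15, 0.15])
q = np.full(8, 1.0 / 8.0)  # uniform distribution

d = kl_divergence(p, q)
```

With 3 binary variables the table has only 8 entries, but at 100 variables it would need 2^100 entries, which is why exact divergence computation requires exploiting the models' independence structure.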


Similar Articles

Approximating discrete probability distributions with decomposable models

A heuristic procedure is presented to approximate an n-dimensional discrete probability distribution with a decomposable model of a given complexity. It is shown that, without loss of generality, the search space can be restricted to a suitable subclass of decomposable models, whose members are called elementary models. The selected elementary model is constructed in an incremental manner accor...


Learning discrete decomposable graphical models via constraint optimization

Statistical model learning problems are traditionally solved using either heuristic greedy optimization or stochastic simulation, such as Markov chain Monte Carlo or simulated annealing. Recently, there has been an increasing interest in the use of combinatorial search methods, including those based on computational logic. Some of these methods are particularly attractive since they can also be...


Alternative parametrizations and reference priors for decomposable discrete graphical models

For a given discrete decomposable graphical model, we identify several alternative parametrizations, and construct the corresponding reference priors for suitable groupings of the parameters. Specifically, assuming that the cliques of the graph are arranged in a perfect order, the parameters we consider are conditional probabilities of clique-residuals given separators, as well as generalized l...


Statistical Inference with Unnormalized Discrete Models and Localized Homogeneous Divergences

In this paper, we focus on parameter estimation for probabilistic models in discrete space. A naive calculation of the normalization constant of a probabilistic model on discrete space is often infeasible, which makes statistical inference based on such models difficult. In this paper, we propose a novel estimator for probabilistic models on discrete space, which is derived from an ...


Decomposable Log-linear Models

The present paper considers discrete probability models with exact computational properties. In relation to contingency tables this means closed-form expressions for the maximum likelihood estimate and its distribution. The model class includes what is known as decomposable graphical models, which can be characterized by a structured set of conditional independencies between some variables give...



Journal

Journal title: Proceedings of the AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i10.26443